Deep Rotation Equivariant Network

Authors

  • Junying Li
  • Zichen Yang
  • Haifeng Liu
  • Deng Cai
Abstract

Recently, learning equivariant representations has attracted considerable research attention. Dieleman et al. introduce four operations which can be inserted into a convolutional neural network to learn deep representations equivariant to rotation. However, their approach requires feature maps to be copied and rotated four times in each layer, incurring substantial runtime and memory overhead. To address this problem, we propose the Deep Rotation Equivariant Network (DREN), consisting of cycle layers, isotonic layers and decycle layers. Our proposed layers apply rotation transformations to filters rather than feature maps, achieving a speed-up of more than 2x with even lower memory overhead. We evaluate DREN on the Rotated MNIST and CIFAR-10 datasets and demonstrate that it can improve the performance of state-of-the-art architectures.
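
The core trick is easy to state in code. Below is a minimal sketch (PyTorch; not the authors' implementation, and the function names cycle_conv and decycle are ours) of a cycle layer that convolves the input with four rotated copies of each filter, and a decycle layer that pools the four branches back into a single response. Pooling by element-wise max is one simple choice, shown only for illustration.

    import torch
    import torch.nn.functional as F

    def cycle_conv(x, w, b=None):
        # Cycle layer sketch: one filter bank w of shape (out, in, k, k)
        # yields four output branches, one per 90-degree rotation of the
        # filters. The feature map x is never copied or rotated.
        return [F.conv2d(x, torch.rot90(w, k, dims=(2, 3)), b) for k in range(4)]

    def decycle(branches):
        # Decycle layer sketch: pool the four rotation branches into one
        # response; element-wise max over the branches is one simple choice.
        return torch.stack(branches).max(dim=0).values

    x = torch.randn(1, 3, 28, 28)   # a batch with one 3-channel image
    w = torch.randn(8, 3, 3, 3)     # 8 learnable 3x3 filters
    y = decycle(cycle_conv(x, w))   # shape (1, 8, 26, 26)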


Similar articles

Robustness of Rotation-Equivariant Networks to Adversarial Perturbations

Deep neural networks have been shown to be vulnerable to adversarial examples: very small perturbations of the input that have a dramatic impact on the predictions. A wealth of adversarial attacks and distance metrics to quantify the similarity between natural and adversarial images has been proposed, recently enlarging the scope of adversarial examples with geometric transformations beyond pixel...
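
To make "very small perturbation, dramatic impact" concrete, here is a sketch (PyTorch; ours, not from this paper) of the classic fast gradient sign method of Goodfellow et al., a single gradient step bounded in L-infinity norm:

    import torch
    import torch.nn.functional as F

    def fgsm(model, x, y, eps=0.03):
        # One L-infinity-bounded gradient step that maximally increases the
        # loss; often enough to flip the prediction while remaining visually
        # imperceptible. Clamping to the valid pixel range is omitted.
        x = x.clone().detach().requires_grad_(True)
        F.cross_entropy(model(x), y).backward()
        return (x + eps * x.grad.sign()).detach()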


Transformation Equivariant Boltzmann Machines

We develop a novel modeling framework for Boltzmann machines, augmenting each hidden unit with a latent transformation assignment variable which describes the selection of the transformed view of the canonical connection weights associated with the unit. This enables the inferences of the model to transform in response to transformed input data in a stable and predictable way, and avoids learni...
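
A rough sketch of the central construction (NumPy; our assumed conventions, not the paper's code): each hidden unit keeps one canonical filter, and a latent assignment variable selects among its transformed views, here taken to be the four 90-degree rotations.

    import numpy as np

    def transformed_views(w):
        # w: (J, k, k) canonical filters, one per hidden unit; return the
        # four 90-degree rotations as views of shape (4, J, k, k).
        return np.stack([np.rot90(w, t, axes=(1, 2)) for t in range(4)])

    def unit_activations(v, w, c):
        # Evidence for each (hidden unit j, transformation t) pair:
        # a[t, j] = <T_t w_j, v> + c_j. Inference treats the assignment as
        # a multinomial over t, so a rotated input shifts the assignment
        # rather than requiring relearned weights.
        W = transformed_views(w).reshape(4, w.shape[0], -1)  # (4, J, k*k)
        return W @ v.ravel() + c                             # (4, J)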


Polar Transformer Networks

Convolutional neural networks (CNNs) are inherently equivariant to translation. Efforts to embed other forms of equivariance have concentrated solely on rotation. We expand the notion of equivariance in CNNs through the Polar Transformer Network (PTN). PTN combines ideas from the Spatial Transformer Network (STN) and canonical coordinate representations. The result is a network invariant to tra...
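
The canonical-coordinate idea is easy to demonstrate. The sketch below (NumPy/SciPy; ours, not PTN's code) resamples an image on a log-polar grid about its centre; a rotation of the original image then becomes a circular shift along the angle axis, and a scaling becomes a shift along the log-radius axis, both of which ordinary convolutions handle equivariantly.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def log_polar(img, n_r=64, n_theta=64):
        # Resample img on a log-polar grid centred on the image midpoint.
        h, w = img.shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        r = np.exp(np.linspace(0, np.log(min(cy, cx)), n_r))
        t = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
        R, T = np.meshgrid(r, t, indexing='ij')
        coords = np.stack([cy + R * np.sin(T), cx + R * np.cos(T)])
        return map_coordinates(img, coords, order=1, mode='nearest')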


Learning Steerable Filters for Rotation Equivariant CNNs

In many machine learning tasks it is desirable that a model’s prediction transforms in an equivariant way under transformations of its input. Convolutional neural networks (CNNs) implement translational equivariance by construction; for other transformations, however, they are compelled to learn the proper mapping. In this work, we develop Steerable Filter CNNs (SFCNNs) which achieve joint equi...
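
As a concrete instance of steerability (the classic first-order example of Freeman and Adelson, not the SFCNN construction itself): a first-derivative-of-Gaussian filter rotated by any angle is an exact linear combination of two fixed basis filters, so no resampling and no restriction to 90-degree angles is needed.

    import numpy as np

    def gaussian_derivative_basis(size=9, sigma=2.0):
        # Basis pair: first x- and y-derivatives of a Gaussian
        # (up to a constant factor of 1/sigma^2).
        ax = np.arange(size) - size // 2
        X, Y = np.meshgrid(ax, ax)
        g = np.exp(-(X**2 + Y**2) / (2 * sigma**2))
        return -X * g, -Y * g

    def steer(gx, gy, theta):
        # A copy of the filter rotated by theta is an analytic combination
        # of the fixed basis: G_theta = cos(theta)*G_x + sin(theta)*G_y.
        return np.cos(theta) * gx + np.sin(theta) * gy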


Equivariant Satake Category and Kostant-Whittaker Reduction

We explain (following V. Drinfeld) how the G(C[[t]]) equivariant derived category of the affine Grassmannian can be described in terms of coherent sheaves on the Langlands dual Lie algebra equivariant with respect to the adjoint action, due to some old results of V. Ginzburg. The global cohomology functor corresponds under this identification to restriction to the Kostant slice. We extend this ...



Journal:
  • CoRR

Volume: abs/1705.08623  Issue: -

Pages: -

Publication date: 2017